Results 1 - 20 of 48
1.
Sci Rep ; 14(1): 5553, 2024 03 06.
Article in English | MEDLINE | ID: mdl-38448515

ABSTRACT

A person with impaired emotion recognition is not able to correctly identify the facial expressions displayed by other individuals. The aim of the present study is to assess eye gaze and facial emotion recognition in a healthy population using dynamic avatars in immersive virtual reality (IVR). For the first time, viewing of each area of interest of the face in IVR is studied by gender and age. This work in healthy people was conducted to assess the future usefulness of IVR in patients with deficits in the recognition of facial expressions. Seventy-four healthy volunteers participated in the study. The materials used were a laptop computer, a game controller, and a head-mounted display. Dynamic virtual faces randomly representing the six basic emotions plus the neutral expression were used as stimuli. After the virtual human represented an emotion, a response panel was displayed with the seven possible options. Besides storing the hits and misses, the software internally divided the faces into different areas of interest (AOIs) and recorded how long participants looked at each AOI. Regarding the overall accuracy of the participants' responses, hits decreased from the youngest to the middle-aged and older adults. All three groups spent the highest percentage of time looking at the eyes, but younger adults had the highest percentage. It is also noteworthy that attention to the face compared to the background decreased with age. Moreover, the hits of women and men were remarkably similar and, in fact, there were no statistically significant differences between them. In general, men paid more attention to the eyes than women, whereas women paid more attention to the forehead and mouth. In contrast to previous work, our study indicates that there are no differences between men and women in facial emotion recognition. Moreover, in line with previous work, the percentage of face viewing time for younger adults is higher than for older adults. However, contrary to earlier studies, older adults look more at the eyes than at the mouth. Consistent with other studies, the eyes are the AOI with the highest percentage of viewing time. For men, the most viewed AOI is the eyes for all emotions, in both hits and misses. Women look more at the eyes for all emotions, except for joy, fear, and anger on hits. On misses, they look more at the eyes for almost all emotions except surprise and fear.
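As background to the AOI viewing-time percentages discussed above, the sketch below shows one simple way such figures could be derived from gaze samples already mapped to face regions. It is a minimal illustration, not the study's software; the AOI labels and the sampling assumption are the author of this note's own.

    # Minimal sketch: per-AOI viewing-time percentages from a stream of gaze
    # samples, assuming each eye-tracker sample has already been mapped to an
    # area of interest such as "eyes", "mouth" or "background".
    from collections import Counter

    def aoi_dwell_percentages(gaze_samples):
        """gaze_samples: iterable of AOI labels, one per eye-tracker sample."""
        counts = Counter(gaze_samples)
        total = sum(counts.values())
        if total == 0:
            return {}
        return {aoi: 100.0 * n / total for aoi, n in counts.items()}

    # Example: 60% of samples on the eyes, 25% on the mouth, 15% on the background.
    print(aoi_dwell_percentages(["eyes"] * 60 + ["mouth"] * 25 + ["background"] * 15))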


Subjects
Virtual Reality, Male, Middle Aged, Humans, Female, Aged, Emotions, Fear, 60453, Anger
2.
Psiquiatr. biol. (Internet) ; 31(1): [100449], Jan.-Mar 2024.
Article in Spanish | IBECS | ID: ibc-231636

ABSTRACT

Although suicide is the leading external cause of death in Spain, there is no protocolized prevention or intervention plan at the national level. The ultimate aim of this study is the design, implementation and evaluation of a new suicide risk prevention program, called "RENACE", at the Albacete University Hospital Complex (Complejo Hospitalario Universitario de Albacete). To this end, the sociodemographic and clinical profile of patients with suicidal ideation and suicidal behavior was described. The following results were obtained: the majority of patients were women (59%), and the most prevalent age group was 31 to 65 years old. Among the child and adolescent population, the predominant age range was 14 to 17 years. Regarding the clinical profile, the main diagnosis was trauma- and stressor-related disorders, followed by depressive disorders. (AU)


Subjects
Male, Female, Adolescent, Young Adult, Adult, Middle Aged, Aged, /methods, /organization & administration, Suicide/prevention & control, Suicidal Ideation, Demography
3.
J Imaging ; 9(10)2023 Sep 25.
Article in English | MEDLINE | ID: mdl-37888300

ABSTRACT

Surface defect detection with machine learning has become an important tool in industry and a growing field of study for researchers and practitioners in recent years. A consolidated source of information is needed to help focus on a given type of surface. In this systematic review, we present a classification for surface defect detection based on convolutional neural networks (CNNs), organized by surface type. Findings: Out of 253 records identified, 59 primary studies were eligible. Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines, we analyzed the structure of each study and the concepts related to defects and their types on surfaces. The review focuses mainly on establishing a classification for the surface types most used in industry (metal, building, ceramic, wood, and special). We delve into the specifics of each surface category, offering illustrative examples of their applications within both industrial and laboratory settings. Furthermore, we propose a new taxonomy of machine learning based on the obtained results and collected information. We summarized the studies and extracted their main characteristics, such as type of surface, problem type, timeline, type of network, techniques, and datasets. Among the most relevant results of our analysis, we found that metallic surfaces are the most used, appearing in 62.71% of the studies, and that the most prevalent problem type is classification, accounting for 49.15% of the total. Furthermore, we observe that transfer learning was employed in 83.05% of the studies, while data augmentation was utilized in 59.32%. Our findings also provide insights into the cameras most frequently employed, along with the strategies adopted in certain articles to address illumination challenges and the approaches to creating datasets for real-world applications. The main results presented in this review allow for a quick and efficient search of information for researchers and professionals interested in improving the results of their defect detection projects. Finally, we analyzed the trends that could open new fields of study for future research in the area of surface defect detection.
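Since the review reports transfer learning in 83.05% of the studies, the sketch below illustrates that general pattern for a surface-defect classifier. It is a hedged example, not taken from any reviewed paper; the backbone, weight string and binary task are assumptions, and it presumes torchvision >= 0.13.

    # Hedged illustration of the transfer-learning pattern: reuse an ImageNet
    # backbone and retrain only the final layer for "defective vs. non-defective".
    import torch.nn as nn
    from torchvision import models

    def build_defect_classifier(num_classes=2):
        model = models.resnet18(weights="IMAGENET1K_V1")  # pretrained backbone
        for p in model.parameters():
            p.requires_grad = False                       # freeze pretrained weights
        model.fc = nn.Linear(model.fc.in_features, num_classes)  # new head to train
        return model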

4.
J Imaging ; 9(10)2023 Sep 25.
Article in English | MEDLINE | ID: mdl-37888301

ABSTRACT

This paper presents a systematic review of articles on computer-vision-based flying obstacle detection, with a focus on midair collision avoidance. Publications up to 2022 were searched in the Scopus, IEEE, ACM, MDPI, and Web of Science databases. Of the 647 publications initially obtained, 85 were finally selected and examined. The results show an increasing interest in this topic, especially in relation to object detection and tracking. Our study hypothesizes that widespread access to commercial drones, improvements in single-board computers, and their compatibility with computer vision libraries have contributed to the increase in the number of publications. The review also shows that the proposed algorithms are mainly tested using simulation software and flight simulators, and only 26 papers report testing with physical flying vehicles. This systematic review highlights other gaps to be addressed in future work. Several identified challenges relate to increasing the success rate of threat detection and to testing solutions in complex scenarios.

5.
Internet Interv ; 34: 100679, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37822788

ABSTRACT

Background: Anxiety in university students can lead to poor academic performance and even dropout. The Adult Manifest Anxiety Scale (AMAS-C) is a validated measure designed to assess the level and nature of anxiety in college students. Objective: The aim of this study is to provide internet-based alternatives to the AMAS-C for the automated identification and prediction of anxiety in young university students. Two anxiety prediction methods, one based on facial emotion recognition and the other on text emotion recognition, are described and validated using the AMAS-C Test Anxiety, Lie and Total Anxiety scales as ground truth. Methods: The first method analyses facial expressions, identifying the six basic emotions (anger, disgust, fear, happiness, sadness, surprise) and the neutral expression, while the students complete a technical skills test. The second method examines emotions in posts on the students' Facebook profiles, classified as positive, negative or neutral. Both approaches aim to predict the presence of anxiety. Results: Both methods achieved a high level of precision in predicting anxiety and proved effective in identifying anxiety disorders with respect to the AMAS-C validation tool. Text-based prediction showed a slight advantage in precision (86.84%) over face-based prediction (84.21%). Conclusions: The applications developed can help educators, psychologists and relevant institutions to identify at an early stage those students who are likely to fail academically at university due to an anxiety disorder.
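The precision figures above are reported against the AMAS-C ground truth; the sketch below shows how such a figure could be computed. The thresholding rule and all values are assumptions for illustration, not the authors' method or data.

    # Minimal sketch: precision of an anxiety flag against AMAS-C ground truth.
    from sklearn.metrics import precision_score

    def anxiety_flags(negative_ratios, threshold=0.5):
        # Assumed rule: flag a student when the share of negative posts/frames
        # exceeds a threshold (hypothetical, not the paper's rule).
        return [1 if r > threshold else 0 for r in negative_ratios]

    y_true = [1, 0, 1, 1, 0]                  # AMAS-C: anxious vs. not anxious (toy data)
    y_pred = anxiety_flags([0.8, 0.2, 0.6, 0.4, 0.7])
    print(precision_score(y_true, y_pred))    # fraction of flagged students truly anxious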

6.
Int J Neural Syst ; 33(10): 2350053, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37746831

ABSTRACT

Facial affect recognition is a critical skill in human interactions that is often impaired in psychiatric disorders. To address this challenge, tests have been developed to measure and train this skill. Recently, virtual human (VH) and virtual reality (VR) technologies have emerged as novel tools for this purpose. This study investigates the unique contributions of different factors in the communication and perception of emotions conveyed by VHs. Specifically, it examines the effects of the use of action units (AUs) in virtual faces, the positioning of the VH (frontal or mid-profile), and the level of immersion in the VR environment (desktop screen versus immersive VR). Thirty-six healthy subjects participated in each condition. Dynamic virtual faces (DVFs), VHs with facial animations, were used to represent the six basic emotions and the neutral expression. The results highlight the important role of the accurate implementation of AUs in virtual faces for emotion recognition. Furthermore, it is observed that frontal views outperform mid-profile views in both test conditions, while immersive VR shows a slight improvement in emotion recognition. This study provides novel insights into the influence of these factors on emotion perception and advances the understanding and application of these technologies for effective facial emotion recognition training.
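Because the abstract stresses the accurate implementation of action units (AUs), the table below lists commonly cited FACS/EMFACS-style prototype AU sets for the six basic emotions. This is illustrative background only; the abstract does not specify which AU combinations the study actually implemented.

    # Commonly cited prototype action units per basic emotion (illustrative,
    # not the AU sets used in this study).
    EMOTION_PROTOTYPE_AUS = {
        "happiness": [6, 12],
        "sadness":   [1, 4, 15],
        "surprise":  [1, 2, 5, 26],
        "fear":      [1, 2, 4, 5, 7, 20, 26],
        "anger":     [4, 5, 7, 23],
        "disgust":   [9, 15, 16],
    }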


Subjects
Facial Recognition, Immersion, Humans, Emotions, Healthy Volunteers
7.
Sci Rep ; 13(1): 6007, 2023 04 12.
Article in English | MEDLINE | ID: mdl-37045889

ABSTRACT

The negative, mood-congruent cognitive bias described in depression, as well as excessive rumination, have been found to interfere with emotional processing. This study focuses on the assessment of facial emotion recognition in patients with depression through a new set of dynamic virtual faces (DVFs). The sample consisted of 54 stable patients compared with 54 healthy controls. The experiment consisted of an emotion recognition task using non-immersive virtual reality (VR) with DVFs of the six basic emotions and the neutral expression. Patients with depression showed worse performance in facial affect recognition than healthy controls. Age of onset was negatively correlated with emotion recognition, and no correlation was observed for duration of illness or number of lifetime hospitalizations. In the depression group, there was no correlation between emotion recognition and degree of psychopathology, excessive rumination, degree of functioning, or quality of life. Hence, it is important to improve and validate VR tools for emotion recognition to achieve greater methodological homogeneity across studies and to be able to establish more conclusive results.


Subjects
Depression, Facial Recognition, Humans, Quality of Life, Facial Expression, Emotions
9.
Sensors (Basel) ; 24(1)2023 Dec 31.
Article in English | MEDLINE | ID: mdl-38203095

ABSTRACT

Defect detection is a key element of quality control in today's industries, and the process requires the incorporation of automated methods, including image sensors, to detect any potential defects that may occur during the manufacturing process. While there are various methods for inspecting surfaces such as metal and building materials, only a limited number of techniques are specifically designed to analyze specialized surfaces, such as ceramics, which can reveal distinctive anomalies or characteristics that require a more precise and focused approach. This article describes a study and proposes an extended solution for defect detection on ceramic pieces within an industrial environment, utilizing a computer vision system with deep learning models. The solution includes an image acquisition process and a labeling platform to create training datasets, as well as an image preprocessing technique, to feed a machine learning algorithm based on convolutional neural networks (CNNs) capable of running in real time within a manufacturing environment. The developed solution was implemented and evaluated at a leading Portuguese company that specializes in the manufacturing of tableware and fine stoneware. The collaboration between the research team and the company resulted in an automated and effective system for detecting defects in ceramic pieces, achieving an accuracy of 98.00% and an F1-score of 97.29%.
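As a reminder of how the two reported figures relate, the sketch below computes accuracy and F1 from a binary confusion matrix. The counts are invented for illustration only and are not the paper's data; they merely produce values of roughly the reported magnitude.

    # Hypothetical confusion-matrix counts (not the paper's data).
    tp, fp, fn, tn = 180, 3, 7, 310

    accuracy  = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)
    recall    = tp / (tp + fn)
    f1        = 2 * precision * recall / (precision + recall)
    print(f"accuracy={accuracy:.2%}  F1={f1:.2%}")   # roughly 98% and 97%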

10.
Sensors (Basel) ; 22(22)2022 Nov 17.
Article in English | MEDLINE | ID: mdl-36433482

ABSTRACT

This article introduces a systematic review on arousal classification based on electrodermal activity (EDA) and machine learning (ML). From an initial set of 284 articles searched for in six scientific databases, fifty-nine were finally selected according to various established criteria. The systematic review made it possible to analyse all the steps to which the EDA signals are subjected: acquisition, pre-processing, processing and feature extraction. Finally, all ML techniques applied to the features of these signals for arousal classification were studied. It was found that support vector machines and artificial neural networks stand out among the supervised learning methods given their high performance. In contrast, unsupervised learning is not present in the detection of arousal through EDA. This systematic review concludes that the use of EDA for the detection of arousal is widespread, with particularly good classification results for the ML methods found.
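The pipeline described above (pre-processed EDA, feature extraction, supervised classifier) can be summarized with the hedged sketch below. The feature set, window length and synthetic labels are assumptions for illustration, not taken from any reviewed study.

    # Sketch of an EDA-based arousal classifier: simple time-domain features
    # fed to a support vector machine (illustrative data and features only).
    import numpy as np
    from sklearn.svm import SVC

    def eda_features(signal):
        # Basic statistical descriptors of one pre-processed EDA window.
        return [np.mean(signal), np.std(signal), np.max(signal) - np.min(signal)]

    X = np.array([eda_features(np.random.rand(256)) for _ in range(40)])
    y = np.random.randint(0, 2, size=40)      # 0 = low arousal, 1 = high arousal
    clf = SVC(kernel="rbf").fit(X, y)
    print(clf.predict(X[:3]))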


Subjects
Galvanic Skin Response, Machine Learning, Arousal, Neural Networks (Computer), Support Vector Machine
11.
Front Psychol ; 13: 934880, 2022.
Article in English | MEDLINE | ID: mdl-36312091

ABSTRACT

This paper explores the key factors influencing mental health professionals' behavioral intention to adopt virtual humans as a means of affect recognition training. Therapies targeting social cognition deficits are in high demand given that these deficits are related to a loss of functioning and quality of life in several neuropsychiatric conditions such as schizophrenia, autism spectrum disorders, affective disorders, and acquired brain injury. Therefore, developing new therapies would greatly improve the quality of life of this large cohort of patients. A questionnaire based on the second revision of the Unified Theory of Acceptance and Use of Technology (UTAUT2) questionnaire was used for this study. One hundred and twenty-four mental health professionals responded to the questionnaire after viewing a video presentation of the system. The results confirmed that mental health professionals showed a positive intention to use virtual reality tools to train affect recognition, as they allow manipulation of social interaction with patients. Further studies should be conducted with therapists from other countries to reach more conclusions.

12.
Int J Neural Syst ; 32(10): 2250041, 2022 Oct.
Article in English | MEDLINE | ID: mdl-35881017

ABSTRACT

The assessment of physiological signals such as electroencephalography (EEG) has become a key point in the research area of emotion detection. This study compares the performance of two EEG devices, a low-cost brain-computer interface (BCI) (Emotiv EPOC+) and a high-end EEG (BrainVision), for the detection of four emotional conditions in 20 participants. For that purpose, signals were acquired with both devices under the same experimental procedure, and a comparison was made under three different scenarios, according to the number of channels selected and the sampling frequency of the signals analyzed. A total of 16 statistical, spectral and entropy features were extracted from the EEG recordings. A statistical analysis revealed a larger number of statistically significant features for the high-end EEG than for the BCI device under the three comparative scenarios. In addition, different machine learning algorithms were used to evaluate the classification performance of the features extracted from the high-end EEG and the low-cost BCI in each scenario. Artificial neural networks reported the best performance for both devices, with an F1-score of 75.08% for the BCI and 98.78% for the EEG. Although the professional EEG outcomes were higher than those of the low-cost BCI, both devices demonstrated a notable performance for the classification of the four emotional conditions.
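The 16 features used are not detailed in the abstract; as a hedged example of the spectral kind, the sketch below estimates alpha-band power for one channel with Welch's method. The band limits, window length and sampling rate are assumptions.

    # Illustrative spectral feature: alpha-band power of one EEG channel.
    import numpy as np
    from scipy.signal import welch
    from scipy.integrate import trapezoid

    def alpha_band_power(channel, fs):
        # Welch periodogram of a pre-processed channel sampled at fs Hz.
        f, pxx = welch(channel, fs=fs, nperseg=min(len(channel), 2 * fs))
        band = (f >= 8) & (f <= 13)             # alpha band, 8-13 Hz
        return trapezoid(pxx[band], f[band])    # integrate the power spectral density

    print(alpha_band_power(np.random.randn(128 * 10), fs=128))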


Subjects
Brain-Computer Interfaces, Algorithms, Electroencephalography/methods, Emotions/physiology, Humans, Neural Networks (Computer)
13.
Int J Neural Syst ; 32(10): 2250029, 2022 Oct.
Article in English | MEDLINE | ID: mdl-35719085

ABSTRACT

The recognition of facial expression of emotions in others is essential in daily social interactions. The different areas of the face play different roles in decoding each emotion. To find out which ones are more important, the traditional approach has been to use eye-tracking devices with static pictures to capture which parts of the face people are looking at when decoding emotions. However, the ecological validity of this approach is limited because, unlike in real life, there is no movement in the face that can be used to identify the emotion. The use of virtual reality technology opens the door to new experiences in which the users perceive that they are in front of dynamic virtual humans. Therefore, our main aim is to investigate whether the user's immersion in a virtual environment influences the way dynamic virtual faces are visually scanned when decoding emotions. An experiment involving 74 healthy participants was carried out. The results obtained are consistent with previous works. Having confirmed the correct functioning of our solution, it is our intention to study whether emotion recognition deficits in patients with neuropsychiatric disorders are related to the way they visually scan faces.


Subjects
Facial Expression, Virtual Reality, Emotions, Humans, Recognition (Psychology)
14.
Int J Neural Syst ; 32(10): 2250026, 2022 Oct.
Article in English | MEDLINE | ID: mdl-35469551

ABSTRACT

The identification of the emotional states corresponding to the four quadrants of the valence/arousal space has been widely analyzed in the scientific literature by means of multiple techniques. Nevertheless, most of these methods were based on the assessment of each brain region separately, without considering the possible interactions among different areas. In order to study these interconnections, this study computes for the first time the functional connectivity metric called cross-sample entropy for the analysis of brain synchronization in four groups of emotions from electroencephalographic signals. The outcomes reported a strong synchronization in the interconnections among central, parietal and occipital areas, while the interactions between left frontal and temporal structures and the rest of the brain regions presented the lowest coordination. These differences were statistically significant for the four groups of emotions. All emotions were simultaneously classified with an accuracy of 95.43%, surpassing the results reported in previous studies. Moreover, the differences between high and low levels of valence and arousal, taking into account the state of the counterpart dimension, also provided notable findings about the degree of synchronization in the brain under different emotional conditions and the possible implications of these outcomes from a psychophysiological point of view.
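For readers unfamiliar with the connectivity metric named above, the sketch below is a compact, brute-force illustration of cross-sample entropy between two channels under common parameter choices (embedding dimension m = 2, tolerance r = 0.2 standard deviations). It is not the study's implementation, and the synthetic signals are only for demonstration.

    # Cross-sample entropy between two normalized channels x and y:
    # -ln( matches of length m+1 / matches of length m ), Chebyshev distance.
    import numpy as np

    def cross_sample_entropy(x, y, m=2, r=0.2):
        x = (x - np.mean(x)) / np.std(x)   # express r in units of standard deviation
        y = (y - np.mean(y)) / np.std(y)

        def matches(length):
            xt = np.array([x[i:i + length] for i in range(len(x) - m)])
            yt = np.array([y[i:i + length] for i in range(len(y) - m)])
            d = np.max(np.abs(xt[:, None, :] - yt[None, :, :]), axis=2)
            return np.sum(d <= r)

        return -np.log(matches(m + 1) / matches(m))

    rng = np.random.default_rng(0)
    a = rng.standard_normal(300)
    b = 0.7 * a + 0.3 * rng.standard_normal(300)   # two partially synchronized channels
    print(cross_sample_entropy(a, b))              # lower values indicate stronger synchronization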


Subjects
Electroencephalography, Emotions, Arousal/physiology, Brain/physiology, Emotions/physiology
15.
Sensors (Basel) ; 22(2)2022 Jan 10.
Article in English | MEDLINE | ID: mdl-35062474

ABSTRACT

Augmented humanity (AH) is a term that has been mentioned in several research papers. However, these papers differ in their definitions of AH. The number of publications dealing with AH has grown steadily over time, including high-impact-factor scientific contributions. Nevertheless, the terminology is used without being formally defined. The aim of this paper is to carry out a systematic mapping review of the different existing definitions of AH and its possible application areas. Publications from 2009 to 2020 were searched in the Scopus, IEEE and ACM databases, using the search terms "augmented human", "human augmentation" and "human 2.0". Of the 16,914 publications initially obtained, 133 were finally selected. The mapping results show a growing focus on works based on AH, with computer vision being the index term with the highest number of published articles. Other index terms are wearable computing, augmented reality, human-robot interaction, smart devices and mixed reality. Among the domains where AH is present, there are works in computer science, engineering, robotics, automation and control systems, and telecommunications. This review demonstrates that it is necessary to formalize the definition of AH and to identify the areas of work most open to the use of this concept. This is why the following definition is proposed: "Augmented humanity is a human-computer integration technology that proposes to improve capacity and productivity by changing or increasing the normal ranges of human function through the restoration or extension of human physical, intellectual and social capabilities".


Subjects
Augmented Reality, Robotics, Automation, Humans
16.
Sensors (Basel) ; 21(21)2021 Nov 03.
Article in English | MEDLINE | ID: mdl-34770631

ABSTRACT

Physical activity contributes to the maintenance of health and functioning. However, the percentage of older adults who comply with the recommendations for physical activity levels is low compared to that of younger groups. The SmartWalk system aims to encourage older adults to perform physical activity (i.e., walking in the city), which is monitored and adjusted by healthcare providers for best results. The study reported in this article focused on the implementation of SmartWalk security services to keep personal data safe during communication and at rest, which were validated against a comprehensive use case. The security framework offers various mechanisms, including an authentication system designed to complement username-password pairs with trusted execution environments and token-based features, authorization with different access levels, symmetric and asymmetric key cryptography, critical transaction review, and logging supported by blockchain technology. The resulting implementation contributes to a common understanding of the security features of trustworthy smart city applications and conforms to existing legislation and regulations.
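To make the "logging supported by blockchain technology" idea concrete, the toy sketch below chains log entries by hashing each record together with the previous record's digest, so tampering with one entry breaks the chain. It is a conceptual illustration only, not the SmartWalk code; the field names and example payloads are invented.

    # Toy hash-chained audit log (conceptual sketch, not SmartWalk's implementation).
    import hashlib, json, time

    def append_entry(chain, payload):
        prev_hash = chain[-1]["hash"] if chain else "0" * 64
        body = {"ts": time.time(), "payload": payload, "prev": prev_hash}
        body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        chain.append(body)
        return chain

    log = []
    append_entry(log, {"user": "clinician01", "action": "read_activity_record"})
    append_entry(log, {"user": "clinician01", "action": "update_walk_plan"})
    # Verify the chain: every entry must reference the previous entry's hash.
    print(all(log[i]["prev"] == log[i - 1]["hash"] for i in range(1, len(log))))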


Subjects
Blockchain, Mobile Applications, Telemedicine, Computer Security, Confidentiality, Exercise
17.
Front Psychol ; 12: 675515, 2021.
Article in English | MEDLINE | ID: mdl-34335388

ABSTRACT

Purpose: The purpose of this study was to determine the optimal interpersonal distance (IPD) between humans and affective avatars for facial affect recognition in immersive virtual reality (IVR). The ideal IPD is the one at which humans achieve the highest number of hits and the shortest reaction times in recognizing the emotions displayed by avatars. The results should help design future therapies to remedy facial affect recognition deficits. Methods: A group of 39 healthy volunteers participated in an experiment in which they were shown 65 dynamic faces in IVR and had to identify the six basic emotions plus the neutral expression presented by the avatars. We limited the experiment to five distances: D1 (35 cm), D2 (55 cm), D3 (75 cm), D4 (95 cm), and D5 (115 cm), all belonging to the intimate and personal interpersonal spaces. Of the 65 faces, 13 were presented at each of the included distances. The views were shown at different angles: 50% in frontal view, 25% from the right profile, and 25% from the left profile. The order of appearance of the faces presented to each participant was randomized. Results: The overall success rate in facial emotion identification was 90.33%, with D3 being the IPD with the best overall recognition hits, although no statistically significant differences were found between the IPDs. Consistent with results obtained in previous studies, identification rates for negative emotions were higher with increasing IPD, whereas recognition of positive emotions improved when the IPD was closer. In addition, the study revealed irregular behavior in the detection of the emotion surprise. Conclusions: IVR allows us to reliably assess facial emotion recognition using dynamic avatars, as all the IPDs tested proved effective. However, no statistically significant differences in facial emotion recognition were found among the different IPDs.

18.
Article in English | MEDLINE | ID: mdl-34073728

ABSTRACT

Perinatal death is the death of a baby that occurs between the 22nd week of pregnancy (or when the baby weighs more than 500 g) and 7 days after birth. After perinatal death, parents go through a process of perinatal grief. Midwives and nurses can develop interventions to improve the perinatal grief process. The aim of this review was to determine the efficacy of nursing interventions to facilitate the grief process following perinatal death. A systematic review of the literature was carried out. Studies that met the selection criteria underwent a quality assessment using the Joanna Briggs Institute critical appraisal tool. Four articles were selected out of the 640 found: two quasi-experimental studies and two randomized controlled clinical trials. The interventions analyzed have a positive effect on psychological self-concept, role function and mutual commitment, as well as on depression, post-traumatic stress and grief symptoms. These interventions are effective whether they are carried out before the perinatal loss or after it has occurred. Support from health professionals for affected parents, their participation in the loss, expressing feelings and emotions, using distraction methods, group sessions, social support, physical activity, and family education are some of the effective interventions.


Subjects
Perinatal Death, Emotions, Female, Grief, Humans, Parturition, Perinatal Death/prevention & control, Pregnancy, Social Support
19.
J Affect Disord ; 290: 40-51, 2021 07 01.
Article in English | MEDLINE | ID: mdl-33991945

ABSTRACT

BACKGROUND: Social functioning impairment has been described in several psychiatric illnesses, including depressive disorders. It is associated with a deterioration in global functioning and quality of life; thus, there is growing interest in psychosocial functioning remediation interventions. This systematic review aims to review all psychotherapeutic, pharmacological and biological social functioning interventions in depressive disorders. METHODS: A systematic search was conducted on PubMed, PsycINFO and Scopus, from the earliest articles to 2019, following the PRISMA guidelines. 72 original papers were extracted from an initial number of 1827, based on the selected eligibility criteria. RESULTS: A growing body of research was observed in the last 10 years, with most studies showing a low level of scientific evidence. The main diagnosis found was major depressive disorder, and the principal social cognition domains assessed were emotional processing and attributional style. The type of intervention most often found was pharmacological, followed by psychotherapeutic interventions classified as "non-specific". The treatments showed an improvement in depressive symptoms and positive results for emotional processing and attributional style. LIMITATIONS: Because of the lack of well-controlled designs, the very small number of interventions focusing on remediation, and the low homogeneity in the assessment of social aspects across studies, comparing results and drawing general conclusions is difficult. CONCLUSIONS: Although a promising body of literature has been developed in recent years on the improvement of psychosocial functioning in patients with depressive disorders, more studies are needed to clarify relevant aspects in this area.


Subjects
Major Depressive Disorder, Cognition, Major Depressive Disorder/therapy, Humans, Quality of Life, Social Perception
20.
Ergonomics ; 64(9): 1146-1159, 2021 Sep.
Article in English | MEDLINE | ID: mdl-33860739

ABSTRACT

It is now widely recognised that aspects such as tiredness or mood state can have an impact on an individual's wellbeing. However, other, less studied factors may also be influential, and their analysis is important to maximise personal wellbeing. The aim of this study was to determine the influence of a set of 12 selected factors. By analysing a 20-experiment case study with soft computing techniques, the intention was to establish the most appropriate configuration of each factor to compose an optimal living environment that fosters wellbeing. The analysis revealed that ambient lighting and stress level are the factors that most impact emotional wellbeing. To a lesser extent, being able to take a break, ambient temperature and ambient noise play a relatively determining role. The findings of this work can be used to establish a living environment for older persons that favours their emotional wellbeing. Practitioner summary: This study analyses the level of influence of a set of ambient factors on the emotional wellbeing of older people, conducting, to this end, a series of controlled experiments, and concluding that ambient lighting and stress level are the factors most relevant to promoting a better living environment. KEY POINTS: Older adults' emotional interpretation of pictures depends on the environment and ambient factors. Ambient factors, such as lighting and stress, have a significant, positive effect on visual interpretation of stimuli and greater wellbeing. The use of soft computing techniques facilitates the quantification of the influence of factors affecting emotional wellbeing.


Subjects
Emotions, Lighting, Aged, Aged 80 and over, Humans